27 research outputs found

    Machine Learning-Based Elastic Cloud Resource Provisioning in the Solvency II Framework

    The Solvency II Directive (Directive 2009/138/EC) is a European Directive issued in November 2009 and effective from January 2016, enacted by the European Union to regulate the insurance and reinsurance sector through the discipline of risk management. Solvency II requires European insurance companies to perform consistent evaluation and continuous monitoring of risks, a process which is computationally complex and extremely resource-intensive. To this end, companies are required to equip themselves with adequate IT infrastructures, incurring a significant outlay. In this paper we present the design and development of a Machine Learning-based approach for transparently deploying the most resource-intensive portion of the Solvency II-related computation on a cloud environment. Our proposal targets DISAR, a Solvency II-oriented system initially designed to work on a grid of conventional computers. We show how our solution reduces the overall expenses associated with the computation without compromising the privacy of the companies’ data (making it suitable for conventional public cloud environments), while meeting the strict time constraints imposed by the Directive. Additionally, the system is organized as a self-optimizing loop that exploits information gathered from actual (useful) computations, thus requiring a shorter training phase. We present an experimental study conducted on Amazon EC2 to assess the validity and efficiency of our proposal.
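
    The abstract does not spell out the provisioning logic, but the self-optimizing loop it describes can be pictured roughly as follows: a regression model, retrained on the runtimes of past (useful) computations, predicts the smallest instance count expected to meet the deadline. This is a minimal sketch under those assumptions; the feature names, the model choice, and the EC2/DISAR integration are illustrative, not the paper's implementation.

    ```python
    # Hypothetical sketch of a self-optimizing provisioning loop (not the paper's code).
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    history_X, history_y = [], []     # features of past runs and their measured runtimes (hours)
    model = GradientBoostingRegressor()

    def choose_instances(n_policies, n_scenarios, deadline_hours, max_instances=64):
        """Pick the smallest instance count predicted to meet the deadline."""
        if len(history_y) < 10:                      # cold start: be conservative
            return max_instances
        model.fit(np.array(history_X), np.array(history_y))
        for n in range(1, max_instances + 1):
            predicted_runtime = model.predict([[n_policies, n_scenarios, n]])[0]
            if predicted_runtime <= deadline_hours:
                return n
        return max_instances

    def record_run(n_policies, n_scenarios, n_instances, runtime_hours):
        """Feed the measured runtime of a real computation back into the training set."""
        history_X.append([n_policies, n_scenarios, n_instances])
        history_y.append(runtime_hours)
    ```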

    An Optimality Approach to the Application Ratio for the Matching Adjustment in the Solvency II Regime

    We refer to the Technical Specifications provided by EIOPA to implement the package of long-term guarantees measures which shall be included in the Solvency II Framework Directive. One of these regulatory measures concerns the Application Ratio, a coefficient defining what portion of the Maximum Matching Adjustment an insurance company can apply to the risk-free rates for discounting its obligations, given the matching properties of the assigned asset portfolio. In this paper we propose an optimization algorithm providing a reliable assessment of the Application Ratio. The Application Ratio provided by the algorithm is optimal in the sense that it has the maximum value given the structure of the asset-liability portfolio. This value corresponds to the minimum attainable level for the losses incurred from forced sales of defaultable bonds with mispriced market value. We show that under natural assumptions this optimality problem takes the form of a linear programming problem, which can be easily solved using standard numerical procedures. A matching criterion defined in stronger form can also be applied by imposing appropriate run-off constraints in the linear programming problem. The value of the optimal Application Ratio can be used by a Supervisor as an objective benchmark for checking the appropriateness of the Application Ratio adopted by the undertaking. The optimal liquidation policy provided by the algorithm can also be used by an insurance undertaking that wants to apply conservative management actions to its asset-liability portfolio.
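
    The abstract does not give the exact formulation, but the kind of linear programme it alludes to can be sketched as follows: choose the fraction of each bond to liquidate so that a liquidity shortfall is covered at the minimum attainable forced-sale loss. All numbers, variable names, and constraints below are hypothetical illustrations, not the EIOPA or paper formulation; run-off constraints would enter as additional linear inequalities.

    ```python
    # Hypothetical sketch of a forced-sale linear programme (illustrative data only).
    import numpy as np
    from scipy.optimize import linprog

    price = np.array([95.0, 102.0, 88.0])   # cash raised per unit of bond sold
    loss  = np.array([4.0, 1.5, 6.0])       # loss per unit from selling at a mispriced value
    shortfall = 120.0                       # liquidity needed to meet liability outflows

    # minimise total loss  s.t.  price @ x >= shortfall,  0 <= x <= 1
    res = linprog(c=loss,
                  A_ub=-price.reshape(1, -1),   # rewritten as -price @ x <= -shortfall
                  b_ub=np.array([-shortfall]),
                  bounds=[(0.0, 1.0)] * len(loss),
                  method="highs")
    print("fractions to liquidate:", res.x)
    print("minimum attainable loss:", res.fun)
    ```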

    Non-Contact Detection of Breathing Using a Microwave Sensor

    In this paper the use of a continuous-wave microwave sensor as a non-contact tool for quantitative measurement of respiratory tidal volume is evaluated through experiments on seventeen healthy volunteers. The sensor's working principle is described and several factors that can affect its response are analyzed. A suitable data-processing procedure has been devised that is able to reject the majority of breath measurements taken under unsuitable conditions. Furthermore, a relationship has been found between the microwave sensor measurements and the volume inspired and expired at quiet breathing (tidal volume).
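
    A minimal sketch of the kind of processing described, on synthetic data: breaths recorded under unsuitable conditions are rejected via a quality score, and a linear relation between the sensor signal and tidal volume is fitted on the remaining measurements. The variable names, the quality threshold, and the synthetic data are assumptions for illustration only.

    ```python
    # Illustrative sketch on synthetic data (not the paper's processing chain).
    import numpy as np

    rng = np.random.default_rng(0)
    tidal_volume = rng.uniform(0.3, 0.8, size=100)                # litres (synthetic ground truth)
    sensor_amp = 2.0 * tidal_volume + rng.normal(0.0, 0.05, 100)  # synthetic sensor amplitude
    quality = rng.uniform(0.0, 1.0, size=100)                     # e.g. a motion-artefact score

    keep = quality > 0.3                                          # reject unsuitable breaths
    slope, intercept = np.polyfit(sensor_amp[keep], tidal_volume[keep], deg=1)

    def estimate_tidal_volume(amplitude):
        """Estimate tidal volume (litres) from the microwave sensor amplitude."""
        return slope * amplitude + intercept
    ```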

    Automated Prediction of the Response to Neoadjuvant Chemoradiotherapy in Patients Affected by Rectal Cancer

    Simple Summary: Colorectal cancer is the second most deadly tumor by number of deaths, after lung cancer, and the third by number of new cases, after breast and lung cancer. Correct and rapid identification (i.e., segmentation) of the cancer regions is a fundamental task for correct patient diagnosis. In this study, we propose a novel automated pipeline for the segmentation of MRI scans of patients with LARC in order to predict the response to nCRT using radiomic features. The study involved the retrospective analysis of T2-weighted MRI scans of 43 patients affected by LARC. The segmentation of tumor areas was on par with or better than state-of-the-art results, while requiring smaller sample sizes. The analysis of radiomic features allowed us to predict the TRG score, in agreement with the state of the art. Background: Rectal cancer is a malignant neoplasm of the large intestine resulting from uncontrolled cell proliferation in the rectal tract. Predicting the pathologic response to neoadjuvant chemoradiotherapy (nCRT) from the primary staging MRI scan in patients affected by locally advanced rectal cancer (LARC) could lead to significant improvements in patient survival and quality of life. In this study, we evaluated the possibility of automating this estimation from the primary staging MRI scan, using a fully automated artificial intelligence-based model for the segmentation and subsequent characterization of the tumor areas by means of radiomic features. The tumor regression grade (TRG) score was used to evaluate the clinical outcome. Methods: Forty-three patients treated at the IRCCS Sant'Orsola-Malpighi Polyclinic were retrospectively selected for the study; a U-Net model was trained for the automated segmentation of the tumor areas; the radiomic features were collected and used to predict the TRG score. Results: The segmentation of tumor areas outperformed state-of-the-art results in terms of the Dice similarity coefficient, or was comparable to them with the added advantage of handling mucinous cases. Analysis of the radiomic features extracted from the lesion areas allowed us to predict the TRG score, with results in agreement with the state of the art. Conclusions: The results obtained for TRG prediction with the proposed fully automated pipeline support its possible use as a viable decision support system for radiologists in clinical practice.
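
    The pipeline described above ends with a classifier that maps radiomic features of the segmented lesion to a TRG-based outcome. A minimal sketch of that last step is shown below; the feature matrix, the binary outcome encoding, and the random-forest model are placeholders, since the abstract does not specify the paper's feature set or classifier.

    ```python
    # Illustrative sketch of the radiomics-to-TRG step (placeholder data and model).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    n_patients, n_features = 43, 30
    X = np.random.rand(n_patients, n_features)     # placeholder radiomic feature matrix
    y = np.random.randint(0, 2, size=n_patients)   # placeholder response (e.g. good vs poor TRG)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print("cross-validated AUC:", scores.mean())
    ```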

    Gaia Early Data Release 3: Structure and properties of the Magellanic Clouds

    We compare the Gaia DR2 and Gaia EDR3 performances in the study of the Magellanic Clouds and show the clear improvements in precision and accuracy in the new release. We also show that the systematics still present in the data make the determination of the 3D geometry of the LMC a difficult endeavour; this is at the very limit of the usefulness of the Gaia EDR3 astrometry, but it may become feasible with the use of additional external data. We derive radial and tangential velocity maps and global profiles for the LMC for the several subsamples we defined. To our knowledge, this is the first time that the two planar components of the ordered and random motions are derived for multiple stellar evolutionary phases in a galactic disc outside the Milky Way, showing the differences between younger and older phases. We also analyse the spatial structure and motions in the central region, the bar, and the disc, providing new insights into features and kinematics. Finally, we show that the Gaia EDR3 data allow the Magellanic Bridge to be clearly resolved, and we trace the density and velocity flow of the stars from the SMC towards the LMC not only globally, but also separately for young and evolved populations. This allows us to confirm an evolved population in the Bridge that is slightly shifted from the younger population. Additionally, we were able to study the outskirts of both Magellanic Clouds, in which we detected some well-known features and indications of new ones.
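
    The tangential velocity maps mentioned above ultimately rest on the standard conversion from proper motion and distance to transverse velocity. A minimal sketch of that conversion follows; the LMC distance and the proper-motion values are illustrative, not those derived in the paper.

    ```python
    # Proper motion (mas/yr) and distance (kpc) to tangential velocity (km/s).
    import numpy as np

    def tangential_velocity_kms(pm_mas_yr, distance_kpc):
        """v_t [km/s] = 4.74047 * mu [mas/yr] * d [kpc]."""
        return 4.74047 * pm_mas_yr * distance_kpc

    lmc_distance_kpc = 49.6                 # commonly adopted LMC distance (illustrative)
    pm_total = np.hypot(1.85, 0.25)         # example proper-motion components in mas/yr
    print(tangential_velocity_kms(pm_total, lmc_distance_kpc))
    ```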

    Modulators of axonal growth and guidance at the brain midline with special reference to glial heparan sulfate proteoglycans


    Applications of distributed and parallel computing in the solvency II framework: The DISAR system

    We address computational problems deriving from Solvency II compliance in the context of Italian life insurance. Solvency II requires insurance undertakings to perform market-consistent valuation of technical provisions and continuous monitoring of risks. We examine the case of profit-sharing policies with minimum guarantees, which is the most widespread type of life policy in Italy. Market-consistent valuation of the complex cash flows generated by these contracts entails modelling of management actions and the use of numerical techniques in a stochastic framework, typically Monte Carlo simulation on a fine-grained time grid. Fulfilling the resulting highly demanding computational tasks is possible only by implementing the valuation procedures on parallel and distributed architectures. In this work we introduce DISAR, a Solvency II-compliant system designed to work on a grid of conventional computers, and discuss its performance. © 2011 Springer-Verlag Berlin Heidelberg
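
    The computational pattern described here, Monte Carlo simulation on a fine time grid with the scenario set split across machines, can be sketched in a few lines. The fund dynamics, the minimum-guarantee payoff, and all parameters below are toy assumptions; a real Solvency II valuation simulates rates, assets, and management actions jointly.

    ```python
    # Toy sketch of scenario-parallel Monte Carlo valuation (not the DISAR model).
    import numpy as np
    from multiprocessing import Pool

    N_STEPS, DT, RATE, SIGMA = 600, 1.0 / 12.0, 0.02, 0.15   # monthly grid, toy parameters

    def discounted_payoff_batch(args):
        seed, n_paths = args
        rng = np.random.default_rng(seed)
        z = rng.standard_normal((n_paths, N_STEPS))
        log_returns = (RATE - 0.5 * SIGMA**2) * DT + SIGMA * np.sqrt(DT) * z
        fund = np.exp(np.cumsum(log_returns, axis=1))          # toy reference-fund paths
        payoff = np.maximum(fund[:, -1], 1.0)                  # toy minimum guarantee at maturity
        return payoff.mean() * np.exp(-RATE * N_STEPS * DT)

    if __name__ == "__main__":
        batches = [(seed, 25_000) for seed in range(8)]        # 8 workers x 25k paths each
        with Pool(processes=8) as pool:
            values = pool.map(discounted_payoff_batch, batches)
        print("estimated market-consistent value:", np.mean(values))
    ```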

    Sviluppare il mercato delle rendite vitalizie

    We address the problem of defining the principles and methods for computing the "fair price" of participating life annuities with a minimum return guarantee, placing ourselves in the informational position of the buyer. The pricing problem cannot be solved with traditional actuarial techniques, since the revaluation process of the annuity depends on the management strategy of the reference fund, and a consistent valuation must rely on a stochastic market model. The paper proposes "conventions" to be applied to the stochastic pricing model that allow an adequate computation of the Money's Worth Ratio, now used internationally as an index for comparing the quality of annuities.
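
    For orientation, the Money's Worth Ratio discussed above is the expected present value of the annuity payments divided by the premium paid. The toy computation below uses a deterministic survival curve and a flat discount rate; as the abstract argues, participating annuities with minimum guarantees actually require a stochastic market model, so every number here is illustrative.

    ```python
    # Toy Money's Worth Ratio under deterministic, illustrative assumptions.
    import numpy as np

    premium = 10_000.0
    annual_payment = 550.0
    years = np.arange(1, 41)
    survival_prob = 0.99 ** years        # toy survival curve
    discount = 1.03 ** -years            # toy flat 3% discount rate

    expected_present_value = np.sum(annual_payment * survival_prob * discount)
    money_worth_ratio = expected_present_value / premium
    print("Money's Worth Ratio:", round(money_worth_ratio, 3))
    ```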

    Manuale di finanza - III. Modelli stocastici e contratti derivati

    This third volume is divided into four parts and supplemented by appendices. The first and second parts study derivative contracts (forwards, futures and options): their contractual features and economic rationale are specified, the bounds imposed on prices by the arbitrage principle are discussed, and some issues relevant to the supervision of market participants and markets are outlined. The third part addresses option pricing theory. The structure and the logic of use of stochastic pricing models, introduced with the Cox, Ross and Rubinstein model, are discussed with reference to the Black and Scholes model, also pointing out interesting connections with the Capital Asset Pricing Model. The fourth part concerns valuation models for contracts that depend in an essential way on interest rates; the Cox, Ingersoll and Ross model is the reference for discussing the foundations of valuation under uncertainty and the logic of its application to bond and bond-derivative markets, for developing comparisons with other models (Vasicek and "Black-style" models), and for extending the pricing scheme to inflation-linked contracts.
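
    Since the third part introduces option pricing through the Cox, Ross and Rubinstein model, a minimal binomial-tree pricer is sketched below as a companion example; the parameters are illustrative and the implementation is a generic textbook version, not code from the book.

    ```python
    # European call under the Cox-Ross-Rubinstein binomial tree (illustrative parameters).
    import numpy as np

    def crr_call_price(S0, K, r, sigma, T, n):
        dt = T / n
        u = np.exp(sigma * np.sqrt(dt))            # up factor
        d = 1.0 / u                                # down factor
        q = (np.exp(r * dt) - d) / (u - d)         # risk-neutral up probability
        disc = np.exp(-r * dt)
        prices = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
        values = np.maximum(prices - K, 0.0)       # payoffs at maturity
        for _ in range(n):                         # backward induction through the tree
            values = disc * (q * values[:-1] + (1 - q) * values[1:])
        return values[0]

    print(crr_call_price(S0=100.0, K=100.0, r=0.02, sigma=0.2, T=1.0, n=200))
    ```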